What is this?

Gradient descent is an optimization algorithm used to find a minimum of a given objective function. It does this by iteratively making small adjustments to the input parameters, known as taking steps, in the direction of steepest descent — that is, opposite the gradient of the objective function. Gradient descent is a powerful tool used extensively in machine learning, deep learning, and other areas of optimization.
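The iteration described above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the objective f(x) = (x − 3)², the learning rate, and the starting point are all illustrative choices.

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Minimize an objective by repeatedly stepping against its gradient."""
    x = x0
    for _ in range(steps):
        x = x - learning_rate * grad(x)  # step opposite the gradient
    return x

# Objective f(x) = (x - 3)^2 has gradient f'(x) = 2 * (x - 3),
# so the minimum is at x = 3.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # converges toward 3.0
```

Each step moves the parameter a small amount in the direction that most reduces the objective; the learning rate controls the step size, trading convergence speed against stability.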

See also: neural network, self-organization, free will, cognitive science

EP87 Joscha Bach on Theories of Consciousness

Currents 078: John Ash on AI Art Tools

EP137 Ken Stanley on Neuroevolution

Currents 036: Melanie Mitchell on Why AI is Hard

EP33 Melanie Mitchell on the Elements of AI

EP25 Gary Marcus on Rebooting AI